PX14A6G 1 z525230px14a6g.htm

 

Name of the Registrant: Meta Platforms, Inc.

 

Name of Person relying on exemption - Christina O'Connell

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule, but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

The proponents urge you to vote FOR the Shareholder Proposal calling for a Report on Allegations of Political Entanglement and Content Management Biases in India, Item 7 at the Meta Platforms, Inc. Annual Meeting of Shareholders on May 31, 2023.

 

To Meta Shareholders:

Eko (formerly SumOfUs) urges you to vote FOR Proposal 7 at the annual meeting of Meta's shareholders on May 31, 2023. The Proposal requests that the Board of Directors commission a report assessing allegations of biased operations in Meta's largest market, India.

 

 

 

Proposal Seven: Shareholder Proposal Regarding Report on Allegations of Political Entanglement and Content Management Biases in India

 

The proponent of this resolution is Mari Mennel-Bell.

 

Assessing Allegations of Biased Operations

in Meta's Largest Market

 

Whereas: Meta's largest user base is in India, with "over half a billion Indians using Meta services."1 Facebook has apparently been a critical catalyst of religious violence in India, disseminating anti-Muslim hate speech and failing to flag posts and speakers that pose risks in this regard.

 

For instance, in February 2020, Muslim-majority neighborhoods of north-east Delhi were stormed by a mob that destroyed mosques, shops, homes and cars, and killed 53 people. In the months preceding the massacre, the head of a powerful North Indian temple posted a video of a speech to Facebook declaring, "I want to eliminate Muslims and Islam from the face of the Earth." It has been viewed well over 40 million times.

 

According to the Wall Street Journal, Facebook's top policy official in India, Ankhi Das, pushed back against employees who wanted to label BJP politician T. Raja Singh "dangerous" and ban him from the platform after he used Facebook to call Muslims traitors, threaten to raze mosques, and call for Muslim immigrants to be shot. Das argued that punishing Singh would hurt Facebook's business in India.2

 

Facebook India's top remaining employee has ties to the BJP: Shivnath Thukral, who now heads public policy for all of the company's platforms in India after the resignations of other top personnel, assisted in the BJP's 2014 election campaign. Al Jazeera reported that Facebook provided preferential rates for BJP political advertisements and permitted surrogate advertising supporting the BJP, suggesting partisan bias.

 

Further, content moderation in India is undercut by the poor capacity of Meta's "misinformation classifiers" (algorithms) and its human moderators to recognize many of India's 22 officially recognized languages.3

 

In 2019, Meta commissioned the law firm Foley Hoag to conduct a Human Rights Impact Assessment (HRIA) of its India operations. The four-page summary released by Meta provides scant transparency and explicitly acknowledges that the assessment "did not assess or reach conclusions" about whether India operations exhibited bias in content moderation.4

 

The proponent believes Meta's lack of transparency concerning India presents a clear and present danger to the Company's reputation, operations and investors.

 

 

 

RESOLVED:

 

Shareholders request that the Company commission a nonpartisan assessment of allegations of political entanglement and content management biases in its operations in India, focusing on how the platform has been utilized to foment ethnic and religious conflict and hatred, and disclose results in a report to investors, at reasonable expense and excluding proprietary and privileged information. Among other things, the assessment can evaluate:

 

•Evidence of political biases in Company activities, and any steps taken to ensure the Company is non-partisan;

 

•Whether content management algorithms and personnel in India have the scale and multilingual capacity necessary to curtail mass dissemination of hate speech and disinformation;

 

•The relevance of any evidence germane to biases, exposures, and impact disclosed in the previously commissioned India HRIA, as investors have been unable to read the full recommendations.

 

1 https://techcrunch.com/2022/11/16/meta-appoints-new-india-head-amid-key-departures/

2 https://www.wsj.com/articles/facebook-hate-speech-india-politics-muslim-hindu-modi-zuckerberg-11597423346

3 https://slate.com/technology/2021/10/facebook-papers-india-modi-misinformation-rss-bjp.html

 

 

Analysis:

 

Meta's largest market, India, plays a mission-critical role in the growth and development of our company. As TechCrunch recently noted:1

 

With over half a billion Indians using Meta services, the American giant identifies India as its largest market by users.

 

Facebook's family of apps, including Instagram and WhatsApp, have grown fastest in India in recent years, onboarding hundreds of millions of users. It has also made a series of ambitious investments in the country, including cutting a $5.7 billion check to Indian telecom giant Jio Platforms and ramping up the commerce engine of WhatsApp.

 

"India is at the forefront of digital adoption and Meta has launched many of our top products, such as Reels and Business Messaging, in India first…" said Marne Levine, chief business officer of Meta, in a statement.

 

_____________________________

1 https://techcrunch.com/2022/11/16/meta-appoints-new-india-head-amid-key-departures/

 

 

 

This critical market, however, is deeply enmeshed in volatile political and religious conflicts, conflicts which are often promoted and amplified using our company's platforms in violation of our company's terms of service and policies. According to Meta's own internal reports, as reported by the Wall Street Journal:2

 

Inflammatory content on Facebook spiked 300% above previous levels at times during the months following December 2019, a period in which religious protests swept India, researchers wrote in a July 2020 report that was reviewed by The Wall Street Journal.

 

The Journal also noted:

 

Hindu and Muslim users in India say they are subjected to "a large amount of content that encourages conflict, hatred and violence on Facebook and WhatsApp," such as material blaming Muslims for the spread of Covid-19 and assertions that Muslim men are targeting Hindu women for marriage as a form of "Muslim takeover" of the country, the researchers found…

 

"There is so much hatred going on" on Facebook, one Muslim man in Mumbai was quoted as telling the researchers, saying he feared for his life. "It's scary, it's really scary."

 

Such fears are well founded. As TIME reports:3

 

WhatsApp, too, has been used with deadly intent in India — for example by cow vigilantes, Hindu mobs that have attacked Muslims and Dalits accused of killing cows, an animal sacred in Hinduism. At least 44 people, most of them Muslims, were killed by cow vigilantes between May 2015 and December 2018, according to Human Rights Watch. Many cow vigilante murders happen after rumors spread on WhatsApp, and videos of lynchings and beatings are often shared via the app too.

 

Following the 2020 attacks, Middle East Eye summarized our company’s actions as follows:4

 

After the violence in Delhi in 2020, Facebook came under scrutiny. The Delhi Assembly Committee on Peace and Harmony, appointed by Delhi's Legislative Assembly to resolve issues between religious and social groups, looked into Facebook's involvement in the violence. But Facebook refused to appear before the committee: Ajit Mohan, the head of Facebook India, said that the assembly should not interfere in a law and order issue.

 

_____________________________

2 https://www.wsj.com/articles/facebook-services-are-used-to-spread-religious-hatred-in-india-internal-documents-show-11635016354

3 https://time.com/5883993/india-facebook-hate-speech-bjp/

4 https://www.middleeasteye.net/big-story/facebook-meta-india-muslims-allow-hate-speech

 

 

 

The response from Raghav Chadha, the committee's chairman, was withering. He accused Mohan of "an attempt to conceal crucial facts in relation to Facebook's role in the February 2020 Delhi communal violence" and accused Facebook itself of "running away" from scrutiny. In September 2020, the committee prima facie found Facebook "complicit" in the riots, stating that the company "sheltered offensive and hateful content on its platform".

 

The publication continues:

 

Less than a month later, Ajit Mohan, the head of Facebook India, was questioned by a parliamentary panel. During the hearing, opposition politicians criticised Facebook for failing to remove inflammatory content posted by accounts connected to the party.

 

Since then, further evidence has emerged of the close relationship between Facebook and the BJP. In March 2022, for instance, a joint analysis by The Reporters' Collective and ad.watch of 536,070 political advertisements in India concluded that between February 2019 and November 2020, Facebook charged the BJP less for election adverts than other political parties. On average, Congress, India's largest opposition party, paid nearly 29 percent more to show an advert a million times than the BJP did.

 

The report also stated that Facebook allowed surrogate advertisers to promote the BJP in elections - while blocking nearly all surrogate advertisers for the rival Congress Party.

 

Meta's close association with one Indian political party has been reported multiple times and reaches beyond favoritism in advertising rates. Facebook's India staff has been led by former operatives of the ruling BJP.5

 

The way it has applied its hate-speech rules to prominent Hindu nationalists in India, though, suggests that political considerations also enter into the calculus…

 

Ms. Ankhi Das joined Facebook in 2011. As public-policy head for India, South and Central Asia, she oversees a team that decides what content is allowed on the platform, one of the former employees said…

 

_____________________________

5 https://www.wsj.com/articles/facebook-hate-speech-india-politics-muslim-hindu-modi-zuckerberg-11597423346

 

 

 

That team took no action after BJP politicians posted content accusing Muslims of intentionally spreading the coronavirus, plotting against the nation and waging a "love jihad" campaign by seeking to marry Hindu women, that former employee said.

 

Additionally, the Wall Street Journal, reporting on internal Meta research, notes that content from the Modi-associated RSS group is not moderated in line with our company's policies:

 

The researchers, in a report early this year called "Adversarial Harmful Networks: India Case Study," said that much of the content posted by users, groups and pages from the Hindu nationalist Rashtriya Swayamsevak Sangh group, or RSS, is never flagged…

 

Researchers also found pro-RSS users on Facebook were posting a high volume of content about "Love Jihad," a conspiracy theory that is spreading online and has gained traction in recent years. Proponents of the theory claim there is a plot by Muslim men to lure Hindu women with the promise of marriage or love, in order to convert these women to Islam.

 

Other material highlighted as inflammatory that Facebook found to be propagated by pro-RSS entities included assertions that Muslim clerics spit on food to either make it halal or spread Covid-19 as part of "a larger war against Hindus," the researchers wrote.

 

There were a number of dehumanizing posts comparing Muslims to "pigs" and "dogs," and misinformation claiming the Quran calls for men to rape their female family members. Facebook hasn't designated the RSS for removal given "political sensitivities," the document says. Prime Minister Narendra Modi worked for the RSS for decades.

 

As both internal and external reporting pointed to a troubling tolerance for content contributing to serious violence and to improperly close ties to the BJP government, Meta commissioned a Human Rights Impact Assessment in which Indian civil society leaders participated, often at great risk. Meta, however, has refused to release this report or to disclose its findings and mitigation recommendations, in contravention of the UN Guiding Principles and the GNI standards, both of which emphasize the importance of disclosing the results of such assessments. (emphasis added)

 

 

 

Meta has suggested to reporters and advocates that disclosure of the existing report presents security concerns, itself a troubling claim that suggests significant risks in this critical market and to a growth strategy heavily reliant on it. To justify this lack of transparency, Meta has repeatedly referenced the Danish Institute guidelines, which Meta claims allow for such secrecy. However, the Institute's guidance actually notes that such non-disclosure is only a temporary option.6

 

Meta has presented only a four-page summary statement. As TIME reports:7

 

The India HRIA was carried out by an independent law firm, Foley Hoag, which interviewed more than 40 civil society stakeholders, activists, and journalists to complete the report. But Facebook drew criticism from rights groups on Thursday after it released its own four-page summary of the law firm's findings that was almost bereft of any meaningful details…

 

TIME notes that:

 

Facebook adds that the full report does not make any judgment on the most contentious allegation stemming from the Das controversy in 2020: that its moderation of hateful content in India is biased toward the ruling party so as to maintain market access. "The assessors [Foley Hoag] noted that civil society stakeholders raised several allegations of bias in content moderation," Facebook's summary of the report says. "The assessors did not assess or reach conclusions about whether such bias existed." (emphasis added)

 

It is precisely this critical area that our resolution addresses. As investors, it is essential that we have the ability to evaluate potential collusion in our company's largest market.

 

 

One of the stakeholders who participated in the assessment, Apar Gupta, will present Item 7 at the May 31 Meta Annual Meeting. Mr. Gupta is the Founding Director of the Internet Freedom Foundation:

 

Today, my statement draws from the work of grassroots activists, human rights defenders, journalists, gender minorities and lowered caste groups of the most populous country in the world. I salute their courage and recognise their wounds.

 

Today, a crisis affects Meta's reputation, operations, ESG commitments, and, ultimately, your investments. Meta platforms are used by many, if not all, Indians, and India is Meta's largest market. This widespread use, by its very nature, carries a weight of social responsibility for Meta.

 

_____________________________

6 https://www.humanrights.dk/files/media/document/Phase%205_Reporting%20and%20Evaluation_ENG_accessbile.pdf

7 https://time.com/6197154/facebook-india-human-rights/

 

 

 

Today, Meta is failing in its obligations to the Indian republic and in its duty to its shareholders to make full and accurate disclosures regarding their investments. One prominent instance is when New Delhi suffered communal clashes in 2020 that resulted in 53 deaths. When a fact-finding report pointed towards the role of Meta, the company resisted a summons by lawmakers and had to be directed by the Indian Supreme Court to appear. The court observed, "Facebook has become a platform for disruptive messages, voices and ideologies". There is a pattern of Meta avoiding transparency and accountability to its users, public institutions and its shareholders.

 

Today, Meta's Board response would have you believe it makes "extensive disclosures" and has "published an independent human rights impact assessment (HRIA) regarding potential human rights risks in India". This is a materially false statement. The HRIA report has been withheld despite demands by numerous civil society stakeholders, including me, who were interviewed. Meta's four-page disclosures on the HRIA are self-serving, surprising, even stupefying. They contradict the testimony provided by me and are at substantial variance with the release of other HRIAs.

 

Meta's cited justification for non-disclosure is contrary to the UN Guiding Principles as interpreted by the Danish Institute for Human Rights, which expressly states that "such alternatives should be interim measures while companies work towards full disclosure of HRIA processes and findings." We have now waited three years and still there is silence from Meta. This is all the more troubling since India is set to witness a polarizing general election in May 2024.

 

Today, you all have a historic opportunity to prevent complicity in human rights abuse. I urge you to vote for this resolution. Although dual-class shares make it impossible to secure a majority of votes, let that not prevent you from hearing the call of your conscience for the people of India. Please heed the evocation of Prime Minister Nehru on the eve of India's independence: "Peace has been said to be indivisible; so is freedom, so is prosperity now, and so also is disaster in this one world that can no longer be split into isolated fragments."

 

 

 

 

 

This is not a solicitation of authority to vote your proxy. Please DO NOT send us your proxy card; Eko (formerly SumOfUs) is not able to vote your proxies, nor does this communication contemplate such an event. Eko (formerly SumOfUs) urges shareholders to vote for Item No. 7 following the instructions provided on management's proxy mailing.

 

The views expressed are those of the authors and Eko (formerly SumOfUs) as of the date referenced and are subject to change at any time based on market or other conditions. These views are not intended to be a forecast of future events or a guarantee of future results. These views may not be relied upon as investment advice. The information provided in this material should not be considered a recommendation to buy or sell any of the securities mentioned. It should not be assumed that investments in such securities have been or will be profitable. This piece is for informational purposes and should not be construed as a research report.